Approximating Combined Discrete Total Variation and Correlated Sparsity With Convex Relaxations
Authors
Abstract
The recently introduced k-support norm has been successfully applied to sparse prediction problems with correlated features. However, this norm lacks the explicit structural constraints commonly found in machine learning and image processing. We address this problem by incorporating a total variation penalty into the k-support framework. We introduce the (k, s) support total variation norm as the tightest convex relaxation of the intersection of a set of discrete sparsity and total variation penalties. We show that computing this norm leads to a combinatorial graph optimization problem, which we prove to be NP-hard. We then introduce a tractable relaxation with approximation guarantees. We demonstrate the effectiveness of this penalty on classification in the low-sample regime, M/EEG neuroimaging analysis, and background-subtracted image recovery.
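As a hedged illustration of what "tightest convex relaxation of the intersection of discrete sparsity and total variation penalties" can mean formally: one natural reading, written here as an assumption based on the abstract rather than a verbatim definition from the paper, takes the unit ball to be the convex hull of the set cut out by the discrete penalties and defines the norm as the gauge of that ball, with D an assumed discrete difference (graph incidence) operator so that \|Dw\|_0 counts jumps:

```latex
\[
  C = \operatorname{conv}\bigl\{\, w \in \mathbb{R}^d :
        \|w\|_0 \le k,\ \|Dw\|_0 \le s,\ \|w\|_2 \le 1 \,\bigr\},
  \qquad
  \|w\|_{(k,s)} = \inf\{\lambda > 0 : w/\lambda \in C\}.
\]
```

Taking the convex hull is what "tightest" buys: any convex set containing the discretely constrained set must contain C, so no convex penalty can fit the constraints more sharply.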
Similar resources
Structured Sparsity: Discrete and Convex approaches
Compressive sensing (CS) exploits sparsity to recover sparse or compressible signals from dimensionality-reducing, non-adaptive sensing mechanisms. Sparsity is also used to enhance interpretability in machine learning and statistics applications: while the ambient dimension is vast in modern data analysis problems, the relevant information therein typically resides in a much lower dimensional s...
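As a minimal sketch of the sparse recovery problem this survey-style abstract treats at a high level: iterative hard thresholding is one standard CS baseline that alternates a gradient step with a projection onto the l0 ball. The step-size rule, sparsity level, and random test data below are illustrative assumptions.

```python
import numpy as np

def iht(A, y, k, iters=200):
    """Iterative hard thresholding for min ||Ax - y||^2 subject to ||x||_0 <= k."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2      # conservative step from the spectral norm
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        g = x - step * A.T @ (A @ x - y)        # gradient step on the least-squares loss
        idx = np.argsort(np.abs(g))[-k:]        # support of the k largest-magnitude entries
        x = np.zeros_like(g)
        x[idx] = g[idx]                         # hard threshold: project onto the l0 ball
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((80, 200)) / np.sqrt(80)
x_true = np.zeros(200)
x_true[rng.choice(200, 5, replace=False)] = 1.0
print(np.linalg.norm(iht(A, A @ x_true, k=5) - x_true))  # small recovery error
```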
Dual Decomposition for Joint Discrete-Continuous Optimization
We analyse convex formulations for combined discrete-continuous MAP inference using the dual decomposition method. As a consequence, we can provide a more intuitive derivation for the resulting convex relaxation than presented in the literature. Further, we show how to strengthen the relaxation by reparametrizing the potentials, hence convex relaxations for discrete-continuous inference do not ...
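A minimal, hedged sketch of the dual decomposition mechanism the abstract analyses: duplicate the shared variable, dualize the coupling constraint x = z, and run subgradient ascent on the multiplier. The two quadratic subproblems are stand-ins chosen for their closed-form minimizers, not the discrete-continuous MAP potentials of the paper.

```python
# Minimize f(x) + g(x), with f(x) = 0.5*(x - a)^2 and g(z) = 0.5*(z - b)^2,
# rewritten as f(x) + g(z) subject to x = z and solved in the dual.
a, b = 1.0, 3.0
lam = 0.0                              # multiplier for the coupling constraint x - z = 0
for t in range(200):
    x = a - lam                        # argmin_x f(x) + lam * x  (closed form)
    z = b + lam                        # argmin_z g(z) - lam * z  (closed form)
    lam += 0.5 / (t + 1) * (x - z)     # diminishing-step subgradient ascent on the dual
print(x, z)                            # both approach the consensus optimum (a + b) / 2
```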
Convex Clustering via Optimal Mass Transport
We consider approximating distributions within the framework of optimal mass transport and specialize to the problem of clustering data sets. Distances between distributions are measured in the Wasserstein metric. The main problem we consider is that of approximating sample distributions by ones with sparse support. This provides a new viewpoint on clustering. We propose different relaxations o...
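To make "distances between distributions in the Wasserstein metric" concrete, here is a hedged sketch that measures 1-Wasserstein distances between one-dimensional sample sets with scipy and groups them by the nearer of two medoid distributions; the toy data and the fixed medoid choice are illustrative assumptions, not the sparse-support relaxation the paper proposes.

```python
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
# Six 1D sample sets: three drawn near 0, three drawn near 5.
samples = [rng.normal(0.0, 1.0, 300) for _ in range(3)] + \
          [rng.normal(5.0, 1.0, 300) for _ in range(3)]

# Pairwise 1-Wasserstein distances between the empirical distributions.
D = np.array([[wasserstein_distance(u, v) for v in samples] for u in samples])

# Assign each distribution to the closer of two medoids (sample sets 0 and 3).
labels = D[:, [0, 3]].argmin(axis=1)
print(labels)  # expected: [0 0 0 1 1 1]
```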
The comparison of two high-order semi-discrete central schemes for solving hyperbolic conservation laws
This work presents two high-order, semi-discrete, central-upwind schemes for computing approximate solutions of 1D systems of conservation laws. We propose a central weighted essentially non-oscillatory (CWENO) reconstruction, and we also apply a fourth-order reconstruction proposed by Peer et al.; afterwards, we combine these reconstructions with a semi-discrete central-upwind numerical flux ...
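The schemes compared in this reference are high-order and beyond a short sketch; as a deliberately simpler stand-in, the snippet below advances Burgers' equation u_t + (u^2/2)_x = 0 with the classical first-order Lax-Friedrichs central scheme, which shares the central-scheme structure but none of the CWENO or fourth-order reconstruction. Grid size, CFL number, and initial data are assumptions.

```python
import numpy as np

def lax_friedrichs_burgers(u, dx, t_end, cfl=0.45):
    """First-order central (Lax-Friedrichs) scheme for u_t + (u^2/2)_x = 0, periodic BCs."""
    f = lambda w: 0.5 * w ** 2
    t = 0.0
    while t < t_end:
        dt = min(cfl * dx / max(np.abs(u).max(), 1e-12), t_end - t)  # CFL-limited step
        up, um = np.roll(u, -1), np.roll(u, 1)      # periodic neighbours u_{j+1}, u_{j-1}
        u = 0.5 * (up + um) - dt / (2 * dx) * (f(up) - f(um))
        t += dt
    return u

x = np.linspace(0.0, 2.0 * np.pi, 200, endpoint=False)
u = lax_friedrichs_burgers(np.sin(x), x[1] - x[0], t_end=1.0)  # smooth data steepening
print(u.min(), u.max())
```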
Sparse projections onto the simplex
Most learning methods with rank or sparsity constraints use convex relaxations, which lead to optimization with the nuclear norm or the ℓ1-norm. However, several important learning applications cannot benefit from this approach as they feature these convex norms as constraints in addition to the non-convex rank and sparsity constraints. In this setting, we derive efficient sparse projections on...
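A hedged sketch of the kind of projection the abstract refers to: keep the k largest entries and project them onto the probability simplex with the standard sort-based algorithm. This greedy select-then-project construction is a commonly used approach for the sparse simplex problem; the paper's own derivation and guarantees may differ in detail.

```python
import numpy as np

def project_simplex(v, z=1.0):
    """Euclidean projection of v onto {w : w >= 0, sum(w) = z} (sort-based algorithm)."""
    u = np.sort(v)[::-1]                               # sort descending
    css = np.cumsum(u) - z
    rho = np.nonzero(u * np.arange(1, v.size + 1) > css)[0][-1]
    theta = css[rho] / (rho + 1.0)                     # threshold from the active prefix
    return np.maximum(v - theta, 0.0)

def sparse_simplex_projection(v, k, z=1.0):
    """Greedy k-sparse projection: restrict to the k largest entries, then project."""
    idx = np.argsort(v)[-k:]                           # support of the k largest values
    w = np.zeros_like(v, dtype=float)
    w[idx] = project_simplex(v[idx].astype(float), z)
    return w

v = np.array([0.1, 2.0, -0.5, 1.2, 0.3])
print(sparse_simplex_projection(v, k=2))               # two nonzeros summing to 1
```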